
    Wideband Waveform Design for Robust Target Detection

    Future radar systems are expected to use high-bandwidth waveforms, whose main advantage is an improved range resolution. In this paper, a technique to design robust wideband waveforms for a Multiple-Input Single-Output system is developed. The context is optimal detection of a single object with partially unknown parameters. The waveforms are robust in the sense that, for a single transmission, detection capability is maintained over an interval of time-delay and time-scaling (Doppler) parameters. A solution framework is derived, approximated, and formulated as an optimization by means of basis expansion. In terms of probabilities of detection and false alarm, numerical evaluation shows the efficiency of the proposed method compared with a Linear Frequency Modulated signal and a Gaussian pulse. Comment: This paper is submitted for peer review to IEEE Signal Processing Letters.

    Demystifying Deep Learning: A Geometric Approach to Iterative Projections

    Parametric approaches to learning, such as deep learning (DL), are highly popular in nonlinear regression, in spite of the extreme difficulty of training them as their complexity increases (e.g., the number of layers in DL). In this paper, we present an alternative semi-parametric framework which foregoes the ordinarily required feedback by introducing the novel idea of geometric regularization. We show that certain deep learning techniques, such as the residual network (ResNet) architecture, are closely related to our approach. Hence, our technique can be used to analyze these types of deep learning. Moreover, we present preliminary results which confirm that our approach can be easily trained to obtain complex structures. Comment: To appear in the ICASSP 2018 proceedings.

    Fast LASSO based DOA tracking

    In this paper, we propose a sequential, fast DOA tracking technique using the measurements of a uniform linear sensor array in the far field of a set of narrow-band sources. Our approach is based on the sparse approximation technique LASSO (Least Absolute Shrinkage and Selection Operator), which has recently gained considerable interest for DOA and other estimation problems. Considering the LASSO optimization as Bayesian estimation, we first define a class of prior distributions suitable for the sparse representation of the model and discuss its relation to the priors over DOAs and waveforms. Inspired by the Kalman filtering method, we introduce a nonlinear sequential filter on this family of distributions. We derive the filter for a simple random-walk motion model of the DOAs. The method consists of consecutively solving weighted LASSO optimizations using each new measurement and updating the LASSO weights for the next step.
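    One step of such a scheme might look as follows. This is a hedged sketch: the DOA grid, the ISTA solver, and the reweighting rule w_i = 1/(|x_i| + eps) are standard stand-ins, not the paper's exact Bayesian weight update.

```python
import numpy as np

def steering_matrix(m, grid_deg):
    """ULA steering matrix, half-wavelength element spacing."""
    u = np.sin(np.deg2rad(grid_deg))
    return np.exp(1j * np.pi * np.outer(np.arange(m), u))

def weighted_lasso_ista(A, y, w, lam=1.0, n_iter=300):
    """min_x 0.5*||y - A x||^2 + lam * sum_i w_i |x_i| via proximal
    gradient (ISTA), complex-valued to match array snapshots."""
    x = np.zeros(A.shape[1], dtype=complex)
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant
    for _ in range(n_iter):
        g = x - step * (A.conj().T @ (A @ x - y))  # gradient step
        thr = step * lam * w                        # per-element threshold
        mag = np.abs(g)
        x = np.where(mag > thr, (1.0 - thr / np.maximum(mag, 1e-12)) * g, 0.0)
    return x

m, grid = 10, np.arange(-90, 91, 2)        # 10 sensors, 2-degree DOA grid
A = steering_matrix(m, grid)
rng = np.random.default_rng(0)
true_idx = np.argmin(np.abs(grid - 20))    # one source at 20 degrees
y = A[:, true_idx] * 3.0 + 0.1 * (rng.standard_normal(m)
                                  + 1j * rng.standard_normal(m))

w = np.ones(A.shape[1])                    # uniform weights for the first snapshot
x_hat = weighted_lasso_ista(A, y, w)
w_next = 1.0 / (np.abs(x_hat) + 1e-3)      # reweighting for the next snapshot
print(grid[np.argmax(np.abs(x_hat))])      # peak near the true DOA
```

    Each new snapshot would repeat the solve with the updated weights, which is the "consecutive weighted LASSO" structure the abstract describes.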

    Deep Dictionary Learning: A PARametric NETwork Approach

    Deep dictionary learning seeks multiple dictionaries at different image scales to capture complementary coherent characteristics. We propose a method for learning a hierarchy of synthesis dictionaries with an image-classification goal. The dictionaries and classification parameters are trained by a classification objective, and the sparse features are extracted by reducing a reconstruction loss in each layer. The reconstruction objectives in some sense regularize the classification problem and inject source-signal information into the extracted features. The performance of the proposed hierarchical method increases by adding more layers, which consequently makes this model easier to tune and adapt. The proposed algorithm furthermore shows a remarkably lower fooling rate in the presence of adversarial perturbations. The validation of the proposed approach is based on its classification performance using four benchmark datasets, and it is compared to a CNN of similar size.
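    The layered feature extraction described above can be sketched structurally; the dictionary sizes, thresholds, and linear classifier below are illustrative assumptions, and the paper's actual training procedure is not reproduced here.

```python
import numpy as np

def soft(z, t):
    """Soft threshold: the proximal operator of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

rng = np.random.default_rng(1)
D1 = rng.standard_normal((64, 128)) / 8.0    # layer-1 synthesis dictionary
D2 = rng.standard_normal((128, 256)) / 11.0  # layer-2 dictionary
W = rng.standard_normal((10, 256)) / 16.0    # classification parameters

x = rng.standard_normal(64)                  # a flattened input patch
z1 = soft(D1.T @ x, 0.5)                     # layer-1 sparse features
z2 = soft(D2.T @ z1, 0.5)                    # layer-2 sparse features
logits = W @ z2                              # class scores
print(np.count_nonzero(z1), np.count_nonzero(z2))
```

    In the paper's formulation each layer's code would come from minimizing a reconstruction loss rather than a single thresholded projection; the sketch only shows the hierarchical dictionary-then-classifier structure.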

    Maximum a Posteriori Based Regularization Parameter Selection

    The ℓ_1-norm regularized least-squares technique has been proposed as an efficient method to calculate sparse solutions. However, the choice of the regularization parameter is still an unsolved problem, especially when the number of nonzero elements is unknown. In this paper we first design different ML estimators by interpreting the ℓ_1-norm regularization as a MAP estimator with a Laplacian model for the data. We also utilize the MDL criterion to decide on the regularization parameter. The performance of these new methods is evaluated and compared in the context of estimating the Directions Of Arrival (DOA) for simulated data. The simulations show that the performance of the different forms of the MAP estimator is approximately equal in the one-snapshot case, where MDL may not work, but for the multiple-snapshot case both methods can be used.
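    The MDL idea can be illustrated in the simplest setting of an orthonormal dictionary, where the ℓ_1-regularized solution has a closed-form soft-threshold expression. The criterion below, RSS/(2σ²) + (k/2)·log n with known noise variance and a refitted (debiased) solution of k nonzeros, is one common MDL-style variant assumed for this sketch, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma = 64, 0.1
A = np.linalg.qr(rng.standard_normal((n, n)))[0]  # orthonormal dictionary
x_true = np.zeros(n)
x_true[[3, 17, 42]] = [5.0, -4.0, 6.0]            # 3-sparse ground truth
y = A @ x_true + sigma * rng.standard_normal(n)

c = A.T @ y                                       # per-atom correlations
best = None
for lam in np.linspace(0.05, 2.0, 40):
    support = np.abs(c) > lam                     # l1 solution's support
    x = np.where(support, c, 0.0)                 # refit (debias) on support
    rss = np.sum((y - A @ x) ** 2)
    crit = rss / (2 * sigma**2) + 0.5 * support.sum() * np.log(n)
    if best is None or crit < best[0]:
        best = (crit, lam, support)
print(np.flatnonzero(best[2]))                    # support of the chosen model
```

    The selected support contains the true indices {3, 17, 42}; a few small spurious entries may also survive, which is the usual finite-sample behavior of MDL-type selection.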

    On the resolution of the LASSO-based DOA estimation method

    This paper investigates the consistency of LASSO-based DOA estimation of narrow-band signals at infinitely high SNR. Such a method provides a robust and accurate approximation of Maximum Likelihood estimation. However, as we show, unlike standard techniques such as subspace methods, LASSO-based estimation is generally not consistent at high SNR. In turn, considering the true DOAs, we show that the method is consistent for certain configurations of the sources. This observation leads us to relate such conditional consistency to the concept of resolution. We next give a condition for verifying the consistency of a given set of directions and simplify it to a computationally fast equivalent algorithm. The results show that the resolution at infinitely high SNR for m sensors decreases at a rate of 1/m.
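    The 1/m scaling can be made concrete with the classical beampattern of a half-wavelength ULA, whose first null sits at sin(θ) = 2/m; this is an illustrative analogy for how angular resolution improves with the number of sensors, not the paper's derivation.

```python
import numpy as np

def first_null(m):
    """Locate the first null of an m-sensor ULA beampattern in u = sin(theta).
    Analytically this is at u = 2/m for half-wavelength spacing."""
    u = np.linspace(1e-3, 0.9, 30000)
    b = np.abs(np.exp(1j * np.pi * np.outer(np.arange(m), u)).sum(axis=0))
    return u[np.argmax(np.diff(b) > 0)]  # first index where the pattern turns up

# Doubling the number of sensors halves the mainlobe width.
print(first_null(8), first_null(16))  # approximately 0.25 and 0.125
```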

    A Universal Analysis of Large-Scale Regularized Least Squares Solutions

    A problem that has been of recent interest in statistical inference, machine learning and signal processing is that of understanding the asymptotic behavior of regularized least-squares solutions under random measurement matrices (or dictionaries). The Least Absolute Shrinkage and Selection Operator (LASSO, or least squares with ℓ_1 regularization) is perhaps one of the most interesting examples. Precise expressions for the asymptotic performance of LASSO have been obtained for a number of different cases, in particular when the elements of the dictionary matrix are sampled independently from a Gaussian distribution. It has also been empirically observed that the resulting expressions remain valid when the entries of the dictionary matrix are independently sampled from certain non-Gaussian distributions. In this paper, we confirm these observations theoretically when the distribution is sub-Gaussian. We further generalize the previous expressions for a broader family of regularization functions and under milder conditions on the underlying random, possibly non-Gaussian, dictionary matrix. In particular, we establish the universality of the asymptotic statistics (e.g., the average quadratic risk) of LASSO with non-Gaussian dictionaries.
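    A quick numerical illustration of the universality claim: the average quadratic risk of LASSO, solved here by plain ISTA, comes out comparable for a Gaussian dictionary and a Rademacher (±1, sub-Gaussian) one. Problem sizes, sparsity, noise level, and the regularization parameter are arbitrary choices for this sketch.

```python
import numpy as np

def ista_lasso(A, y, lam, n_iter=400):
    """LASSO: min_x 0.5*||y - A x||^2 + lam*||x||_1 via proximal gradient."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    for _ in range(n_iter):
        g = x - step * A.T @ (A @ x - y)
        x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)
    return x

def avg_risk(sampler, trials=5, m=150, p=100, k=10, lam=0.2, seed=0):
    """Average per-coordinate quadratic risk over random problem instances."""
    rng = np.random.default_rng(seed)
    risks = []
    for _ in range(trials):
        A = sampler(rng, m, p) / np.sqrt(m)   # columns of roughly unit norm
        x0 = np.zeros(p)
        x0[rng.choice(p, k, replace=False)] = 1.0
        y = A @ x0 + 0.05 * rng.standard_normal(m)
        risks.append(np.mean((ista_lasso(A, y, lam) - x0) ** 2))
    return np.mean(risks)

r_gauss = avg_risk(lambda rng, m, p: rng.standard_normal((m, p)))
r_rad = avg_risk(lambda rng, m, p: rng.choice([-1.0, 1.0], size=(m, p)))
print(r_gauss, r_rad)  # comparable average quadratic risks
```

    The two risks landing in the same range is exactly the empirical observation the paper proves for sub-Gaussian dictionaries.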